OpenTURNS: An industrial software for uncertainty quantification in simulation
The need to assess robust performance for complex systems and to meet
tighter regulatory requirements (security, safety, environmental control,
health impacts, etc.) has led to the emergence of a new industrial simulation
challenge: taking uncertainties into account when dealing with complex
numerical simulation frameworks. A generic methodology has therefore emerged
from the joint effort of several industrial companies and academic
institutions. EDF R&D, Airbus Group and Phimeca Engineering started a
collaboration at the beginning of 2005, joined by IMACS in 2014, to develop
an open-source software platform dedicated to uncertainty propagation by
probabilistic methods, named OpenTURNS for Open source Treatment of
Uncertainty, Risk 'N Statistics. OpenTURNS addresses the specific industrial
challenges attached to uncertainties: transparency, genericity, modularity
and multi-accessibility. This paper focuses on OpenTURNS and presents its
main features: OpenTURNS is open-source software under the LGPL license,
provided as a C++ library and a Python TUI, and it runs under both Linux and
Windows environments. All the methodological tools are described in the
different sections of this paper: uncertainty quantification, uncertainty
propagation, sensitivity analysis and metamodeling. A section also explains
the generic wrapper mechanism used to link OpenTURNS to any external code. The
paper illustrates the methodological tools as much as possible on an
educational example that simulates the height of a river and compares it to the
height of a dyke that protects industrial facilities. Finally, it gives an
overview of the main developments planned for the next few years.
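The kind of study the abstract describes can be sketched with plain Monte Carlo propagation. The snippet below is a minimal illustration of the river-vs-dyke setup, not the paper's exact flood model: the input distributions, the stage-discharge relation, and all numerical values (`width`, `slope`, `dyke_height`) are assumptions chosen for illustration only.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000

# Hypothetical uncertain inputs (distribution choices are illustrative):
q = rng.gumbel(loc=1013.0, scale=558.0, size=n)   # river flow rate [m^3/s]
ks = rng.normal(30.0, 7.5, size=n)                # Strickler friction coeff.
q = np.clip(q, 10.0, None)                        # guard against nonphysical draws
ks = np.clip(ks, 1.0, None)

# Simplified open-channel relation giving the water height [m]:
width, slope = 300.0, 5e-4
h = (q / (ks * width * np.sqrt(slope))) ** 0.6

# Compare the propagated height distribution to an assumed dyke height:
dyke_height = 4.0
p_overflow = np.mean(h > dyke_height)   # Monte Carlo overtopping probability
print(f"mean height = {h.mean():.2f} m, P(overflow) = {p_overflow:.4f}")
```

OpenTURNS automates exactly this pipeline (distribution definition, propagation, and probability estimation) behind a unified API, so the generic wrapper only has to expose the deterministic model `h = f(q, ks)` to the platform.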
Quadrature Strategies for Constructing Polynomial Approximations
Finding suitable points for multivariate polynomial interpolation and
approximation is a challenging task. Yet, despite this challenge, there has
been tremendous research dedicated to this singular cause. In this paper, we
begin by reviewing classical methods for finding suitable quadrature points for
polynomial approximation in both the univariate and multivariate setting. Then,
we categorize recent advances into those that propose a new sampling approach
and those centered on an optimization strategy. The sampling approaches yield a
favorable discretization of the domain, while the optimization methods pick a
subset of the discretized samples that minimize certain objectives. While not
all strategies follow this two-stage approach, most do. Sampling techniques
covered include subsampling quadratures, Christoffel, induced and Monte Carlo
methods. Optimization methods discussed range from linear programming ideas and
Newton's method to greedy procedures from numerical linear algebra. Our
exposition is aided by examples that implement some of the aforementioned
strategies.
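As a concrete instance of the classical univariate setting the abstract reviews, the sketch below builds a Legendre expansion of a smooth function with coefficients computed by Gauss-Legendre quadrature. The test function and degree are arbitrary choices for illustration; this is not code from the paper.

```python
import numpy as np
from numpy.polynomial import legendre as L

# Approximate f on [-1, 1] by a degree-d Legendre expansion whose
# coefficients are computed with Gauss-Legendre quadrature.
f = lambda x: np.exp(x) * np.sin(3 * x)
d = 12
x, w = L.leggauss(d + 1)          # d+1 nodes: exact for polynomials up to degree 2d+1

# c_k = (2k+1)/2 * integral_{-1}^{1} f(x) P_k(x) dx, evaluated by quadrature
P = L.legvander(x, d)             # Vandermonde matrix: P[i, k] = P_k(x_i)
coeffs = (2 * np.arange(d + 1) + 1) / 2 * (P.T @ (w * f(x)))

# Check the quality of the approximation on a dense grid:
xt = np.linspace(-1, 1, 200)
err = np.max(np.abs(L.legval(xt, coeffs) - f(xt)))
print(f"max error of degree-{d} approximation: {err:.2e}")
```

For analytic functions like this one the coefficients decay geometrically, so a modest degree already gives near-machine accuracy; the multivariate strategies surveyed in the paper aim to retain this behavior while keeping the number of model evaluations manageable.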
Stochastic-Expansions-Based Model-Assisted Probability of Detection Analysis of the Spherically-Void-Defect Benchmark Problem
Probability of detection (POD) is used for reliability analysis in the nondestructive testing (NDT) field. Traditionally it is determined by experimental tests, but it can be enhanced with physics-based simulation models, an approach called model-assisted probability of detection (MAPOD). Accurate physics-based models, however, are usually computationally expensive. In this paper, we implement a type of stochastic polynomial chaos expansion (PCE) as a surrogate for the actual physics-based model in the MAPOD calculation. The state-of-the-art least-angle regression method and a hyperbolic sparse truncation technique are integrated into the PCE construction. The proposed method is tested on a spherically-void-defect benchmark problem developed by the World Federal Nondestructive Evaluation Center. The benchmark problem is extended with two uncertain parameters. The PCE model requires only about 100 sample points for the statistical moments to converge, whereas the direct Monte Carlo method needs more than 10000 samples and the Kriging-based Monte Carlo method oscillates. With about 100 sample points, the PCE model reduces the root-mean-square error to within 1% of the standard deviation of the test points, while the Kriging model cannot reach that level of accuracy even with 200 sample points.
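The core idea of a PCE surrogate with two uncertain parameters can be sketched in a few lines. The snippet below uses a toy analytic stand-in for the expensive NDT model (the benchmark's actual physics is not reproduced here), fits a total-degree-3 Hermite expansion by ordinary least squares rather than the paper's least-angle regression, and reads the mean and variance directly off the coefficients; all model and basis choices are illustrative assumptions.

```python
from itertools import product
from math import factorial

import numpy as np
from numpy.polynomial.hermite_e import hermevander

rng = np.random.default_rng(1)

# Toy stand-in for the expensive physics model, with two standard-normal inputs:
model = lambda x1, x2: np.exp(0.3 * x1) + 0.5 * x1 * x2 + x2**2

# Total-degree-3 PCE basis in 2D using probabilists' Hermite polynomials He_n
deg = 3
multi_idx = [(i, j) for i, j in product(range(deg + 1), repeat=2) if i + j <= deg]

n_train = 100                                  # ~100 samples, as in the abstract
X = rng.standard_normal((n_train, 2))
V1, V2 = hermevander(X[:, 0], deg), hermevander(X[:, 1], deg)
A = np.stack([V1[:, i] * V2[:, j] for i, j in multi_idx], axis=1)
coef, *_ = np.linalg.lstsq(A, model(X[:, 0], X[:, 1]), rcond=None)

# Statistical moments follow analytically from the coefficients, since
# E[He_i(x1) He_j(x2)] = 0 for (i, j) != (0, 0) and ||He_i He_j||^2 = i! j!.
norms = np.array([factorial(i) * factorial(j) for i, j in multi_idx])
mean_pce = coef[0]                             # (0, 0) is first in multi_idx
var_pce = np.sum(coef[1:] ** 2 * norms[1:])
print(f"PCE mean = {mean_pce:.3f}, variance = {var_pce:.3f}")
```

This moment-from-coefficients property is why PCE converges on statistical moments with far fewer model evaluations than direct Monte Carlo: no additional sampling of the surrogate is needed once the coefficients are fitted.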
Polynomial Chaos and Collocation Methods and Their Range of Applicability
In this chapter, the different polynomial chaos and stochastic collocation methodologies used within the UMRIDA project are compared, and guidelines for their use and applicability are formulated.
The Meiji restoration and subsequent factors in Japanese economic success
SIGLE record. Available from Bibliothek des Instituts fuer Weltwirtschaft, ZBW, Duesternbrook Weg 120, D-24105 Kiel (C 141129); FIZ - Fachinformationszentrum Karlsruhe; TIB - Technische Informationsbibliothek. In German.
- …